# DistilBERT Models

## Learn HF Food Not Food Text Classifier (DistilBERT Base Uncased)
Apache-2.0 · Text Classification · Transformers · by HimanshuGoyal2004 · 70 downloads · 1 like

A DistilBERT-based text classification model for distinguishing between food-related and non-food texts.
## Moco SentenceDistilBERT v2.1
Text Embedding · Transformers · Multilingual · by bongsoo · 37 downloads · 2 likes

A sentence-transformer model based on DistilBERT that supports Korean and English for sentence-similarity calculation and feature extraction.
## SBERT Chinese General V2 Distill
Text Embedding · Transformers · by DMetaSoul · 43 downloads · 6 likes

A Chinese sentence-embedding model for general semantic-matching scenarios. Distillation reduced it from a 12-layer BERT to a 4-layer model, significantly improving inference speed while maintaining good performance.
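Sentence-embedding models like the ones above are typically used by encoding two texts and comparing the resulting vectors. A minimal sketch of that comparison step, using toy hand-written 4-dimensional vectors in place of real model output (the sentences and numbers below are invented for illustration):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "sentence embeddings" standing in for encoder output.
query = [0.2, 0.7, 0.1, 0.0]
candidates = {
    "similar sentence": [0.25, 0.65, 0.05, 0.05],
    "unrelated sentence": [0.9, 0.0, 0.0, 0.1],
}

# Rank candidates by similarity to the query, as a semantic matcher would.
ranked = sorted(
    candidates,
    key=lambda k: cosine_similarity(query, candidates[k]),
    reverse=True,
)
print(ranked[0])  # prints "similar sentence"
```

The same ranking logic applies unchanged to real 768-dimensional embeddings; only the vectors come from the model instead of being written by hand.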
## DistilBERT Base Uncased Emotion
Apache-2.0 · Text Classification · English · by bhadresh-savani · 99.59k downloads · 140 likes

A lightweight emotion classification model based on DistilBERT. Knowledge distillation preserves 97% of BERT's language-understanding capability in a model that is 40% smaller and faster at inference.
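The distillation figures above come from the standard knowledge-distillation recipe: the student is trained to match the teacher's temperature-softened output distribution. A minimal sketch of that soft-target loss term, using made-up logits rather than real model outputs:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q) between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy logits over 3 classes: a teacher (BERT) and a student (DistilBERT).
teacher_logits = [4.0, 1.0, 0.5]
student_logits = [3.0, 1.5, 0.2]

T = 2.0  # distillation temperature
soft_targets = softmax(teacher_logits, T)
student_probs = softmax(student_logits, T)

# Soft-target term of the training loss, scaled by T^2 as is conventional.
distill_loss = (T ** 2) * kl_divergence(soft_targets, student_probs)
print(round(distill_loss, 4))
```

In actual training this term is combined with the ordinary cross-entropy on hard labels; the logits here are placeholders, not real model activations.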
## DistilBERT Base Indonesian
MIT · Large Language Model · Transformers · Other · by cahya · 1,815 downloads · 14 likes

A distilled version of the Indonesian BERT base model, designed for uncased (case-insensitive) Indonesian text processing.
## DistilBERT Base Fr Cased
Apache-2.0 · Large Language Model · Transformers · French · by Geotrend · 1,104 downloads · 2 likes

A French-specific version of the multilingual DistilBERT base model, retaining the original model's accuracy while being more compact.
## DistilBERT Base It Cased
Apache-2.0 · Large Language Model · Transformers · Other · by Geotrend · 14 downloads · 1 like

A lightweight version of distilbert-base-multilingual-cased customized for Italian while maintaining the original accuracy.
## DistilBERT Base Th Cased
Apache-2.0 · Large Language Model · Transformers · Other · by Geotrend · 50 downloads · 0 likes

A Thai version customized from the multilingual DistilBERT base model, retaining the original model's accuracy and feature representations.
## DistilBERT Base En No Cased
Apache-2.0 · Large Language Model · Transformers · Other · by Geotrend · 13 downloads · 0 likes

A lightweight version of distilbert-base-multilingual-cased customized for English and Norwegian, reducing model size while maintaining the original accuracy.
## DistilBERT Base No Cased
Apache-2.0 · Large Language Model · Transformers · Other · by Geotrend · 73 downloads · 0 likes

A compact version of distilbert-base-multilingual-cased customized for Norwegian while maintaining the original model's accuracy.
## DistilBERT Base Ru Cased
Apache-2.0 · Large Language Model · Transformers · Other · by Geotrend · 498 downloads · 2 likes

A compact version of the cased multilingual DistilBERT base model customized for Russian, generating representations identical to the original model's while maintaining its accuracy.
## DistilBERT Base Ro Cased
Apache-2.0 · Large Language Model · Transformers · Other · by Geotrend · 21 downloads · 0 likes

A compact version of distilbert-base-multilingual-cased customized for Romanian, reproducing the original model's representations and accuracy.
## DistilBERT Base Vi Cased
Apache-2.0 · Large Language Model · Transformers · Other · by Geotrend · 41 downloads · 1 like

A customized version of the multilingual DistilBERT base model optimized for Vietnamese, retaining the original model's representations and accuracy.
## DistilBERT Base En Ru Cased
Apache-2.0 · Large Language Model · Transformers · Other · by Geotrend · 26 downloads · 0 likes

A version of distilbert-base-multilingual-cased customized for English and Russian, maintaining the original model's accuracy.
## DistilBERT Base Da Cased
Apache-2.0 · Large Language Model · Transformers · Other · by Geotrend · 13 downloads · 0 likes

A compact version of distilbert-base-multilingual-cased customized for Danish while maintaining the original model's representation accuracy.
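One way these smaller language-specific DistilBERT variants can keep the original model's accuracy is by trimming the shared multilingual vocabulary, and the corresponding rows of the token-embedding matrix, down to the tokens the target language actually uses; every kept token's vector stays untouched. A toy sketch of that idea (the vocabulary, vectors, and token set below are invented for illustration, and this is not Geotrend's actual tooling):

```python
# Sketch: shrinking a multilingual model's vocabulary to one language.
full_vocab = ["[CLS]", "[SEP]", "the", "le", "chat", "cat", "дом"]

# Rows of a toy embedding matrix (2-d vectors instead of 768-d).
full_embeddings = {
    "[CLS]": [0.1, 0.2], "[SEP]": [0.0, 0.3], "the": [0.5, 0.1],
    "le": [0.4, 0.4], "chat": [0.7, 0.2], "cat": [0.6, 0.1],
    "дом": [0.9, 0.9],
}

# Tokens observed in a French corpus, plus special tokens we must keep.
french_tokens = {"[CLS]", "[SEP]", "le", "chat"}

# Keep only the needed rows; each kept vector is bit-identical to the
# original, which is why accuracy on the target language is preserved.
trimmed_vocab = [t for t in full_vocab if t in french_tokens]
trimmed_embeddings = {t: full_embeddings[t] for t in trimmed_vocab}

print(len(full_vocab), "->", len(trimmed_vocab))  # prints "7 -> 4"
```

Since the embedding matrix dominates a small transformer's parameter count, dropping unused rows shrinks the model substantially without changing what it computes on the remaining tokens.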
© 2025 AIbase